Newton methods for nonsmooth convex minimization: connections among U-Lagrangian, Riemannian Newton and SQP methods

Authors

  • Scott A. Miller
  • Jérôme Malick
Abstract

This paper studies Newton-type methods for minimization of partly smooth convex functions. Sequential Newton methods are provided using local parameterizations obtained from U-Lagrangian theory and from Riemannian geometry. The Hessian based on the U-Lagrangian depends on the selection of a dual parameter g; by revealing the connection to Riemannian geometry, a natural choice of g emerges for which the two Newton directions coincide. This choice of g is also shown to be related to the least-squares multiplier estimate from a sequential quadratic programming (SQP) approach, and with this multiplier, SQP gives the same search direction as the Newton methods.
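
For orientation, the objects the abstract refers to can be sketched in the notation standard in U-Lagrangian theory (Lemaréchal, Oustry and Sagastizábal); the symbols \bar{x}, g, f_0, c(x), J(x) below are illustrative assumptions, not taken from this page:

% Fix a subgradient g in ri ∂f(x̄) and split the space into nonsmooth and smooth directions:
\mathcal{V} = \operatorname{span}\bigl(\partial f(\bar{x}) - g\bigr), \qquad \mathcal{U} = \mathcal{V}^{\perp}

% The U-Lagrangian, whose second-order behavior depends on the dual parameter g:
L_{\mathcal{U}}(u; g) = \inf_{v \in \mathcal{V}} \bigl\{ f(\bar{x} + u + v) - \langle g, v \rangle \bigr\}

% A generic least-squares multiplier estimate for a constraint c(x) = 0 with Jacobian J(x),
% the kind of estimate the abstract connects to the natural choice of g:
\lambda_{\mathrm{LS}}(x) = \arg\min_{\lambda} \, \lVert \nabla f_0(x) - J(x)^{\top} \lambda \rVert_2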


Similar articles


Forward-backward Truncated Newton Methods for Convex Composite Optimization

This paper proposes two proximal Newton-CG methods for convex nonsmooth optimization problems in composite form. The algorithms are based on a reformulation of the original nonsmooth problem as the unconstrained minimization of a continuously differentiable function, namely the forward-backward envelope (FBE). The first algorithm is based on a standard line search strategy, whereas the second...
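
For reference, a minimal sketch of the forward-backward envelope in the form given by Patrinos and Bemporad, assuming F = f + g with f smooth and a step size γ > 0 (these symbols come from that literature, not from this excerpt):

% Forward-backward envelope of F(x) = f(x) + g(x):
F_{\gamma}(x) = f(x) - \frac{\gamma}{2} \lVert \nabla f(x) \rVert^{2} + g^{\gamma}\bigl(x - \gamma \nabla f(x)\bigr)

% where g^{\gamma} is the Moreau envelope of the nonsmooth part g:
g^{\gamma}(y) = \min_{z} \Bigl\{ g(z) + \tfrac{1}{2\gamma} \lVert z - y \rVert^{2} \Bigr\}

Under suitable assumptions (f twice differentiable), the FBE is continuously differentiable, which is what allows smooth Newton-CG machinery to be applied to a nonsmooth composite problem.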


On the Performance of SQP Methods for Nonlinear Optimization

This paper concerns some practical issues associated with the formulation of sequential quadratic programming (SQP) methods for large-scale nonlinear optimization. SQP methods find an approximate solution of a sequence of quadratic programming (QP) subproblems in which a quadratic model of the objective function is minimized subject to the linearized constraints. Extensive numerical results are...
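
As a reminder of the subproblem structure described above, a generic SQP iteration for equality constraints solves a QP of roughly this form (H_k is a Hessian approximation; inequality constraints are linearized analogously, and exact formulations vary by method):

% QP subproblem at iterate x_k: quadratic model subject to linearized constraints
\min_{d} \; \nabla f(x_k)^{\top} d + \tfrac{1}{2} d^{\top} H_k d \quad \text{s.t.} \quad c(x_k) + J(x_k) d = 0

% The step is then x_{k+1} = x_k + \alpha_k d for some step length \alpha_k.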


BFGS convergence to nonsmooth minimizers of convex functions

Under reasonable conditions, the popular BFGS quasi-Newton minimization algorithm converges globally on smooth convex functions. This result was proved by Powell in 1976; we consider its implications for functions that are not smooth. In particular, an analogous convergence result holds for functions, such as the Euclidean norm, that are nonsmooth at the minimizer.
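
For reference, the Hessian-approximation update at the heart of the method under discussion is the standard BFGS formula (the usual s_k, y_k notation is assumed here):

% BFGS update with s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k):
B_{k+1} = B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k} + \frac{y_k y_k^{\top}}{y_k^{\top} s_k}

At a nonsmooth minimizer, such as x = 0 for f(x) = \lVert x \rVert, the gradient is undefined and the classical smooth analysis no longer applies directly, which is why convergence results for such functions are notable.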


A Quasi-Newton Approach to Nonsmooth Convex Optimization

We extend the well-known BFGS quasi-Newton method and its limited-memory variant (LBFGS) to the optimization of nonsmooth convex objectives. This is done in a rigorous fashion by generalizing three components of BFGS to subdifferentials: the local quadratic model, the identification of a descent direction, and the Wolfe line search conditions. We apply the resulting subLBFGS algorithm to L2-reg...
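
One natural way to make the generalization to subdifferentials concrete, as a hedged sketch rather than the paper's exact formulation, is through the directional derivative written as a supremum over subgradients:

% Directional derivative of a convex f along direction p, via the subdifferential:
f'(x; p) = \sup_{g \in \partial f(x)} g^{\top} p

% Subgradient Wolfe conditions for a step size \eta along p
% (constants 0 < c_1 < c_2 < 1 assumed as in the smooth case):
f(x + \eta p) \le f(x) + c_1 \eta \, f'(x; p), \qquad f'(x + \eta p; p) \ge c_2 \, f'(x; p)

When f is differentiable at the relevant points, these reduce to the classical Wolfe conditions.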



Journal:
  • Math. Program.

Volume: 104, Issue: -

Pages: -

Publication year: 2005